Aria Gen 2


Dexterity from Smart Lenses: Multi-Fingered Robot Manipulation with In-the-Wild Human Demonstrations

Irmak Guzey, Haozhi Qi, Julen Urain, Changhao Wang, Jessica Yin, Krishna Bodduluri, Mike Lambeta, Lerrel Pinto, Akshara Rai, Jitendra Malik, Tingfan Wu, Akash Sharma, Homanga Bharadhwaj

arXiv.org Artificial Intelligence

Learning multi-fingered robot policies from humans performing daily tasks in natural environments has long been a grand goal in the robotics community. Achieving this would mark significant progress toward generalizable robot manipulation in human environments, as it would reduce the reliance on labor-intensive robot data collection. Despite substantial efforts, progress toward this goal has been bottlenecked by the embodiment gap between humans and robots, as well as by difficulties in extracting relevant contextual and motion cues that enable learning of autonomous policies from in-the-wild human videos. We claim that with simple yet sufficiently powerful hardware for obtaining human data and our proposed framework AINA, we are now one significant step closer to achieving this dream. AINA enables learning multi-fingered policies from data collected by anyone, anywhere, and in any environment using Aria Gen 2 glasses. These glasses are lightweight and portable, feature a high-resolution RGB camera, provide accurate on-board 3D head and hand poses, and offer a wide stereo view that can be leveraged for depth estimation of the scene. This setup enables the learning of 3D point-based policies for multi-fingered hands that are robust to background changes and can be deployed directly without requiring any robot data (including online corrections, reinforcement learning, or simulation). We compare our framework against prior human-to-robot policy learning approaches, ablate our design choices, and demonstrate results across nine everyday manipulation tasks. Robot rollouts are best viewed on our website: https://aina-robot.github.io.


Aria Gen 2 Pilot Dataset

Chen Kong, James Fort, Aria Kang, Jonathan Wittmer, Simon Green, Tianwei Shen, Yipu Zhao, Cheng Peng, Gustavo Solaira, Andrew Berkovich, Nikhil Raina, Vijay Baiyya, Evgeniy Oleinik, Eric Huang, Fan Zhang, Julian Straub, Mark Schwesinger, Luis Pesqueira, Xiaqing Pan, Jakob Julian Engel, Carl Ren, Mingfei Yan, Richard Newcombe

arXiv.org Artificial Intelligence

The Aria Gen 2 Pilot Dataset (A2PD) is an egocentric multimodal open dataset captured using the state-of-the-art Aria Gen 2 glasses. To facilitate timely access, A2PD is released incrementally with ongoing dataset enhancements. The initial release features Dia'ane, our primary subject, who records her daily activities alongside friends, each equipped with Aria Gen 2 glasses. It encompasses five primary scenarios: cleaning, cooking, eating, playing, and outdoor walking. In each of the scenarios, we provide comprehensive raw sensor data and output data from various machine perception algorithms. These data illustrate the device's ability to perceive the wearer, the surrounding environment, and interactions between the wearer and the environment, while maintaining robust performance across diverse users and conditions. The A2PD is publicly available at projectaria.com, with open-source tools and usage examples provided in Project Aria Tools.


Fox News AI Newsletter: Nvidia joins Trump onshoring push

FOX News

STACKING CHIPS: Nvidia CEO Jensen Huang said Wednesday that the leading artificial intelligence chipmaker will invest hundreds of billions of dollars in the U.S. supply chain over the next four years. SPOT THE AI LIE: It's becoming more common for images to be made with AI tools. As AI image generation gets more advanced, it's getting trickier to tell the difference between AI-made and human-made images. However, there are still signs to look out for.


Meta unveils new AR glasses with heart rate monitoring

FOX News

The glasses' sensor technology opens up new possibilities for research and development in augmented reality applications. Get ready for some amazing tech that's about to change the way we see the world, literally. Meta has just unveiled its latest creation, the Aria Gen 2 augmented reality (AR) glasses. But don't rush out to get them just yet. Aria Gen 2 is currently in research mode but is designed to push the boundaries of what's possible with AR and AI.